Local receptive field constrained deep networks

Authors

  • Diána Turcsány
  • Andrzej Bargiela
  • Tomás Maul
Abstract

Automatic extraction of distinctive features from a visual information stream is challenging due to the large amount of information contained in most image data. In recent years, deep neural networks (DNNs) have gained outstanding popularity for solving visual information processing tasks. This study reports novel contributions, including a new DNN architecture and training method, which increase the fidelity of DNN-based representations to encodings extracted by visual processing neurons. Our local receptive field constrained DNNs (LRF-DNNs) are pre-trained with a modified restricted Boltzmann machine, the LRF-RBM, which utilizes biologically inspired Gaussian receptive field constraints to encourage the emergence of local features. Moreover, we propose a method for concurrently finding advantageous receptive field centers while training the LRF-RBM. By utilizing LRF-RBMs with gradually increasing receptive field sizes on each layer, our LRF-DNN learns features of increasing complexity and demonstrates hierarchical part-based compositionality. We show superior face completion and reconstruction results on the challenging LFW face dataset.
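The Gaussian receptive field constraint described above can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the image size, the value of sigma, and the random receptive field centers are toy assumptions. The idea shown is that each hidden unit's weight vector is multiplied by a Gaussian mask peaked at that unit's receptive field center, so connections far from the center are suppressed and local features are encouraged.

```python
import numpy as np

def gaussian_rf_mask(img_h, img_w, center, sigma):
    """2-D Gaussian mask over pixel coordinates, peaked (value 1.0)
    at a hidden unit's receptive field center, flattened to a vector."""
    ys, xs = np.mgrid[0:img_h, 0:img_w]
    d2 = (ys - center[0]) ** 2 + (xs - center[1]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2)).ravel()

rng = np.random.default_rng(0)
img_h = img_w = 8                      # toy image size (assumption)
n_vis, n_hid = img_h * img_w, 4        # visible / hidden unit counts

# one receptive field center per hidden unit (random here; the paper
# proposes learning advantageous centers during LRF-RBM training)
centers = rng.integers(0, img_h, size=(n_hid, 2))
masks = np.stack([gaussian_rf_mask(img_h, img_w, c, sigma=2.0)
                  for c in centers])   # shape (n_hid, n_vis)

W = rng.normal(0.0, 0.01, size=(n_hid, n_vis))
W_constrained = W * masks              # weights decay away from each center
```

Increasing sigma on deeper layers widens the masks, which matches the paper's idea of gradually growing receptive field sizes so higher layers can compose more complex features.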

Similar articles

Deep learning models of biological visual information processing

Improved computational models of biological vision can shed light on key processes contributing to the high accuracy of the human visual system. Deep learning models, which extract multiple layers of increasingly complex features from data, achieved recent breakthroughs on visual tasks. This thesis proposes such flexible data-driven models of biological vision and also shows how insights regard...

Stochastic Training of Graph Convolutional Networks

Graph convolutional networks (GCNs) are powerful deep neural networks for graph-structured data. However, GCN computes nodes’ representation recursively from their neighbors, making the receptive field size grow exponentially with the number of layers. Previous attempts on reducing the receptive field size by subsampling neighbors do not have any convergence guarantee, and their receptive field...
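The exponential receptive field growth mentioned in this abstract can be demonstrated with a short sketch (illustrative only; the random graph and layer count are made-up assumptions, not from the paper). Each GCN layer aggregates over 1-hop neighbors, so after K layers a node's representation depends on its entire K-hop neighborhood.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
A = rng.random((n, n)) < 0.03      # sparse random adjacency (assumption)
A = A | A.T                        # make the graph undirected
np.fill_diagonal(A, True)          # self-loops, as in standard GCNs

reach = np.eye(n, dtype=bool)      # 0 hops: each node sees only itself
sizes = []
for _ in range(4):                 # four GCN layers
    # one aggregation step: a node now sees everything its neighbors saw
    reach = (reach.astype(int) @ A.astype(int)) > 0
    sizes.append(int(reach[0].sum()))   # receptive field size of node 0
# sizes is non-decreasing and typically grows rapidly toward n
```

This growth is exactly why subsampling neighbors (as the cited previous attempts do) is attractive for controlling per-node computation cost.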

A practical theory for designing very deep convolutional neural networks

Going deep is essential for deep learning. However, it is not easy: there are many ways of going deep, but most of them are ineffective. In this work, we propose two novel constraints in the design of deep structures to guarantee the performance gain when going deep. Firstly, for each convolutional layer, its capacity of learning more complex patterns should be guaranteed; Secondly, the receptive f...

Stochastic Training of Graph Convolutional Networks with Variance Reduction

Graph convolutional networks (GCNs) are powerful deep neural networks for graph-structured data. However, GCN computes the representation of a node recursively from its neighbors, making the receptive field size grow exponentially with the number of layers. Previous attempts on reducing the receptive field size by subsampling neighbors do not have a convergence guarantee, and their receptive fi...

Understanding the Effective Receptive Field in Deep Convolutional Neural Networks

We study characteristics of receptive fields of units in deep convolutional networks. The receptive field size is a crucial issue in many visual tasks, as the output must respond to large enough areas in the image to capture information about large objects. We introduce the notion of an effective receptive field, and show that it both has a Gaussian distribution and only occupies a fraction of ...
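The effective receptive field idea in this abstract can be reproduced in miniature (an illustrative sketch under simplifying assumptions: uniform 3x3 averaging layers standing in for trained convolutions, and a hand-picked grid size). Propagating a unit impulse from the central output unit through stacked 3x3 layers yields an impulse response that, by the central limit theorem, approaches a Gaussian concentrated well inside the theoretical receptive field.

```python
import numpy as np

def box_blur(x):
    """One 3x3 uniform convolution with zero padding."""
    h, w = x.shape
    p = np.pad(x, 1)
    out = np.zeros_like(x)
    for dy in range(3):
        for dx in range(3):
            out += p[dy:dy + h, dx:dx + w]
    return out / 9.0

erf = np.zeros((21, 21))
erf[10, 10] = 1.0            # unit impulse at the central output unit
for _ in range(5):           # five stacked 3x3 layers
    erf = box_blur(erf)
# theoretical receptive field is 11x11, but the response mass
# concentrates near the center in a roughly Gaussian bump
```

The corner of the theoretical 11x11 field (index [5, 5]) receives only a tiny fraction of the mass carried by the center, matching the paper's observation that the effective receptive field occupies a fraction of the theoretical one.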

Journal:
  • Inf. Sci.

Volume 349-350

Pages -

Publication year 2016